Results 1 - 20 of 32
1.
Vet Parasitol ; 327: 110120, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38266372

ABSTRACT

Parasites are known for their ability to rapidly adapt to changing conditions. For parasitic helminths, changes in climate, along with farming and management practices associated with the intensification of livestock farming, provide novel challenges which can impact on their epidemiology and control. The sustainability of livestock production partially relies on effective control of helminth infection. Therefore, understanding changes in parasite behaviour, and what drives these, is of great importance. Nematodirus battus is an economically important helminth in the UK and temperate regions. Its infective larvae typically overwinter in eggs on pasture and hatch synchronously in spring, causing acute disease in lambs. Attempts to control disease typically rely on whole-flock benzimidazole (BZ) treatments. In recent years, the emergence of BZ-resistance, alongside the hatching of eggs without the classical over-winter 'chill stimulus', have made N. battus more difficult to control. In three previous studies, after collecting a large number of N. battus populations alongside farm management data from commercial farms, we explored the prevalence of genetic mutations associated with BZ-resistance (n = 253 farms), the ability of eggs to hatch with and without a chill stimulus (n = 90 farms) and how farm management practices varied throughout the UK (n = 187 farms). In the present study, we identify factors which may be acting as drivers, or barriers, to either the development of resistance or the variable hatching behaviour of N. battus eggs. Generalised linear mixed effect models were applied to regress experimental hatching and genotyping data on farm management and additional environmental data. Both variable hatching and resistance development appeared associated with the maintenance of parasite refugia as well as grazing management, particularly reseeding of pasture routinely grazed by young lambs each spring and the practice of set-stocked grazing. 
Effective quarantine measures were identified as the main protective factor against the development of BZ-resistance, whereas set-stocked grazing and population bottlenecks, resulting from reseeding heavily contaminated pastures, were risk factors. Spring maximum temperature and other climatic factors were associated with 'typical' hatching of eggs following a chill stimulus, whilst several management factors were linked with hatching without prior chilling. For example, practices which reduce parasite numbers on pasture (e.g. reseeding) or restrict availability of hosts (e.g. resting fields) were found to increase the odds of non-chill hatching. Retention of the timing of lambing and the infection level of the host within the fitted model indicated that the requirement for a chill stimulus prior to hatching may be plastic, perhaps subject to change throughout the grazing season in response to immune development or parasite density-dependence within the host. Further investigation of the influence of the factors retained within the fitted models, particularly the theme of parasite refugia, which was highlighted in relation to both the presence of BZ-resistance alleles and alternative hatching, is required to establish robust, sustainable parasite control and farm management strategies.
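The management associations reported above are odds-based. As a minimal sketch of the underlying calculation, with entirely invented farm counts (the study's actual data are not reproduced here), the odds ratio and Wald confidence interval for a binary management factor, such as reseeding lamb pasture, against presence of non-chill hatching would be computed like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI for a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    orr = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(orr) - z * se_log)
    hi = math.exp(math.log(orr) + z * se_log)
    return orr, lo, hi

# Invented counts: farms that reseed lamb pasture vs those that do not,
# cross-tabulated against presence of non-chill hatching.
orr, lo, hi = odds_ratio_ci(a=18, b=12, c=20, d=40)
print(f"OR = {orr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A mixed-effect model, as used in the study, additionally adjusts for other covariates and for farm-level random effects; the raw odds ratio above is only the unadjusted building block.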


Subject(s)
Nematodirus , Sheep Diseases , Strongylida Infections , Animals , Sheep , Nematodirus/genetics , Farms , Strongylida Infections/epidemiology , Strongylida Infections/veterinary , Strongylida Infections/parasitology , Refugium , Sheep Diseases/epidemiology , Sheep Diseases/prevention & control , Sheep Diseases/parasitology , Ovum , Sheep, Domestic , Parasite Egg Count/veterinary , Feces/parasitology
2.
Behav Processes ; 207: 104847, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36801474

ABSTRACT

Domestic herbivores show a strong motivation to form associations with conspecifics, and the social dynamics of any group are dependent on the individuals within it. Thus, common farm management practices such as mixing may cause social disruption. Social integration of new group members has previously been defined as a lack of aggressive interactions within the group. However, a lack of aggression among group members may not represent full integration into the social group. Here we observe the impact of disrupting groups of cattle, via the introduction of an unfamiliar individual, on the social network patterns of six groups of cattle. Contacts between all individuals in a group were recorded before and after the introduction of the unfamiliar individual. Pre-introduction, resident cattle showed preferential associations with specific individuals in the group. Post-introduction, resident cattle reduced the strength of their contacts (e.g., frequency) with each other relative to the pre-introduction phase. Unfamiliar individuals were socially isolated from the group throughout the trial. The observed social contact patterns suggest that new group members remain socially isolated from established groups for longer than previously thought, and that common farm mixing practices may have negative welfare consequences for introduced individuals.
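The contact "strength" referred to above is the weighted degree of each animal in the contact network: the sum of its contact frequencies with all partners. A minimal sketch with invented contact counts (not the study's recordings):

```python
from collections import defaultdict

def node_strength(contacts):
    """Weighted degree: sum of contact frequencies for each individual.
    `contacts` is a list of (id_a, id_b, n_contacts) tuples."""
    strength = defaultdict(int)
    for a, b, n in contacts:
        strength[a] += n
        strength[b] += n
    return dict(strength)

# Invented contact counts for a small resident group, before and after
# an unfamiliar animal ("new") is introduced.
pre = [("cow1", "cow2", 14), ("cow1", "cow3", 9), ("cow2", "cow3", 11)]
post = [("cow1", "cow2", 8), ("cow1", "cow3", 5), ("cow2", "cow3", 6),
        ("cow1", "new", 1)]  # the introduced animal stays peripheral

s_pre, s_post = node_strength(pre), node_strength(post)
print(s_pre["cow1"], s_post["cow1"], s_post["new"])  # 23 14 1
```

The pattern in the invented numbers mirrors the abstract's finding: residents' strengths drop after the introduction, and the newcomer's strength stays near zero.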


Subject(s)
Aggression , Social Isolation , Animals , Cattle , Motivation , Herbivory , Social Behavior
3.
Prev Vet Med ; 211: 105808, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36566549

ABSTRACT

Bovine tuberculosis (bTB) is a globally distributed zoonotic disease with significant economic impacts. Control measures in Great Britain include testing for and culling diseased animals. Farmers receive compensation for the value of culled animals, but not for the consequential costs of having to comply with testing and associated control measures. Such uncompensated costs can be significant. We present results of a survey of 1,600 dairy and beef farm holdings conducted in England and Wales to update and improve estimates of these consequential costs. Estimated costs are positively skewed and show considerable variance, in agreement with previous, smaller-scale surveys of bTB: most farms experiencing bTB incur modest costs but some suffer significant costs. Testing, movement restrictions and output losses account for over three quarters of total uncompensated costs. Total costs rise with herd size and duration of controls. The composition of consequential costs changes as total costs increase, with an increasing proportion associated with output losses and movement restrictions, and a decreasing proportion associated with testing. Consequential costs tend to be higher for dairy than for beef herds, but this is likely due to the larger herd sizes of dairy farms. Overall, we find the total farm costs of bTB surpass those compensated for by Government in Great Britain. This study contributes to the public-private cost-sharing debate, as farmers bear some of the economic burden of a disease breakdown. The methodology and results presented are crucial for informed Government and farmer decision-making. The identification of potential risk factors in this study was challenging but is of relevance outside GB.


Subject(s)
Cattle Diseases , Tuberculosis, Bovine , Cattle , Animals , Tuberculosis, Bovine/epidemiology , Wales/epidemiology , England/epidemiology , United Kingdom , Risk Factors
4.
Front Vet Sci ; 9: 986739, 2022.
Article in English | MEDLINE | ID: mdl-36504845

ABSTRACT

Background: Premature death of livestock is a problem in all ruminant production systems. While the number of premature ruminant deaths in a country is a reasonable indicator of national livestock health, few data sources exist in a country like Ethiopia that can be used to generate valid estimates. The present study aimed to establish whether three different data sets, each with imperfect information on ruminant mortality, including abortions, could be combined into improved estimates of nationwide mortality in Ethiopia. Methods: We combined information from a recent survey of ruminant mortality with information from the Living Standards Measurement Study and the Disease Outbreak and Vaccination Reporting dataset. Generalized linear mixed and hurdle models were used for data analysis, with results summarized using predicted outcomes. Results: Analyses indicated that most herds experienced zero mortality and reproductive losses, with rare occasions of larger losses. Diseases causing deaths varied greatly both geographically and over time. There was little agreement between the different datasets. While the models aid the understanding of patterns of mortality and reproductive losses, the degree of variation observed limited their predictive scope. Conclusions: The models revealed some insight into why mortality rates are variable over time and therefore less useful in measuring production or health status; alternative measures of productivity, such as the number of offspring raised to 1 year old per dam, would likely be more stable over time and more indicative.

5.
Animals (Basel) ; 11(10)2021 Sep 22.
Article in English | MEDLINE | ID: mdl-34679785

ABSTRACT

Piglet castration results in acute pain and stress to the animal. There is a critical need for effective on-farm methods of pain mitigation. Local anaesthesia using Tri-Solfen® (Animal Ethics Pty Ltd., Melbourne, Australia), a topical local anaesthetic and antiseptic formulation instilled to the wound during surgery, is a newly evolving on-farm method to mitigate castration pain. To investigate the efficacy of Tri-Solfen®, instilled to the wound during the procedure, to alleviate subsequent castration-related pain in neonatal piglets, we performed a large, negatively controlled, randomised field trial in two commercial pig farms in Europe. Piglets (173) were enrolled and randomised to undergo castration with or without Tri-Solfen®, instilled to the wound immediately following skin incision. A 30 s wait period was then observed prior to completing castration. Efficacy was investigated by measuring pain-induced motor and vocal responses during the subsequent procedure and post-operative pain-related behaviour in treated versus untreated piglets. There was a significant reduction in nociceptive motor and vocal response during castration and in the post-operative pain-related behaviour response in Tri-Solfen®-treated compared to untreated piglets, in the first 30 min following castration. Although not addressing pain of skin incision, Tri-Solfen® is effective to mitigate subsequent acute castration-related pain in piglets under commercial production conditions.

6.
Vet Rec ; 189(9): e775, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34375447

ABSTRACT

BACKGROUND: Farm management practices have a major impact on nematode population dynamics. The present study aimed to understand current nematode management practices on UK sheep farms, with a particular focus on Nematodirus battus because of the changing epidemiology and emerging anthelmintic resistance observed in this species. METHODS: A 42-question online survey covering grazing management, farm demographics and parasite control strategies was developed and distributed to the farming community in 2016. Analysis of the 187 completed questionnaires explored regional variations in practices. RESULTS: Uptake of recommendations was variable, particularly for quarantine practices and monitoring tools. Results also highlighted variation in the epidemiology of N. battus; respondents in the north (Scotland, north-west and north-east England) typically reported N. battus in spring, with a perception of more severe clinical symptoms than those from the south (Midlands, Wales, south-east and south-west England; p = 0.03). Farms in the south observed greater changes in the timing of disease (p = 0.006), with N. battus being reported throughout the year on some holdings, and made more frequent use of faecal egg count monitoring (p = 0.006). CONCLUSIONS: Control of N. battus infection is challenging and 'one-size-fits-all' advice is not applicable; however, the information gathered will enable the development of effective, adaptable control strategies.


Subject(s)
Anthelmintics , Nematoda , Nematode Infections , Nematodirus , Sheep Diseases , Animals , Anthelmintics/therapeutic use , Farms , Feces , Nematode Infections/epidemiology , Nematode Infections/prevention & control , Nematode Infections/veterinary , Parasite Egg Count/veterinary , Sheep , Sheep Diseases/drug therapy , Sheep Diseases/epidemiology , Sheep Diseases/prevention & control , Wales
7.
Vet Rec ; 189(1): e28, 2021 07.
Article in English | MEDLINE | ID: mdl-33729562

ABSTRACT

BACKGROUND: Accurate estimation of antimicrobial use (AMU) is important in assessing reduction of agricultural AMU. This cross-sectional study aimed to evaluate several approaches for estimating AMU at the herd level and to report on AMU for beef and dairy farms in Scotland. METHODS: Pharmaceutical sales data for 75 cattle herds (2011-2015) were screened for antimicrobial products and aggregated by herd and year. Several denominators for usage estimates were calculated and compared for their suitability at the herd level. RESULTS: The median total mass of active ingredient sold per kg of bovine livestock was 9.5 mg/kg for beef herds and 14.3 mg/kg for dairy herds. The 'highest priority critically important' antimicrobials (HPCIA) accounted for 10.6% of all sales by total mass of active ingredient, 29.8% by defined daily dose for veterinary use (DDDVet) and 20.0% by defined course dose for veterinary use (DCDVet). These are the first estimates of AMU for beef cattle in the UK, and for cattle of any kind in Scotland. Estimates of herd-level usage based on the population correction unit (PCU) were sensitive to low PCU values for specific herd-years due to their demographic composition. CONCLUSION: Pharmaceutical sales data can provide useful estimates of AMU, but estimating usage per PCU is not appropriate for comparing groups of cattle with different demographic compositions or for setting herd-level targets. Total mass of active ingredient per kilogram of livestock is more stable, and hence more suitable, than PCU-based methods for assessing AMU at the herd level.
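The two denominators being compared differ only in what the total mass of active ingredient is divided by. A toy calculation with invented herd-year figures (chosen so the mg/kg value matches the reported beef median) shows how an atypically small PCU inflates the per-PCU estimate:

```python
def amu_mg_per_kg(total_mg_sold, total_kg_livestock):
    """Total mass of active ingredient per kg of bovine livestock (mg/kg)."""
    return total_mg_sold / total_kg_livestock

def amu_mg_per_pcu(total_mg_sold, pcu_kg):
    """Usage per population correction unit; unstable when PCU is small."""
    return total_mg_sold / pcu_kg

# Hypothetical herd-year: 4.75 kg of active ingredient over 500 000 kg
# of livestock (figures invented for illustration).
mg_sold = 4.75e6
print(f"{amu_mg_per_kg(mg_sold, 500_000):.1f} mg/kg")           # 9.5 mg/kg
# The same sales against an atypically small PCU inflate the estimate:
print(f"{amu_mg_per_pcu(mg_sold, 120_000):.1f} mg per PCU kg")  # 39.6
```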


Subject(s)
Anti-Infective Agents/therapeutic use , Commerce/statistics & numerical data , Farms , Veterinary Medicine/economics , Animals , Cattle , Cross-Sectional Studies , Female , Humans , Male , Scotland , United Kingdom
8.
Parasit Vectors ; 13(1): 494, 2020 Sep 29.
Article in English | MEDLINE | ID: mdl-32993770

ABSTRACT

BACKGROUND: Nematodirus battus, unlike most other gastrointestinal nematodes, undergoes maturation to an infective larva within the egg. Historically, eggs were considered to require a period of chilling over winter followed by a period of temperature above 10 °C for synchronous hatching to occur (generally in spring). Anecdotal reports of Nematodirus infection outwith spring in veterinary journals and the farming press suggest that the concentrated pasture abundance of N. battus infective larvae may be changing. In order for control practices to be adapted, and unexpected disease outbreaks to be avoided, it is important to quantify how parasite epidemiology is changing and to research the drivers behind it. METHODS: The present study investigated the in vitro hatching response to temperature experiences (with and without a period of chilling) for egg samples of 90 N. battus populations obtained from 73 commercial sheep farms. Six aliquots of larvated eggs were prepared per population; three aliquots were placed at 4 °C for 6 weeks to provide a chill stimulus and then incubated at the optimal hatching temperature for the species. The remaining three aliquots were incubated at the hatching temperature without a prior chill stimulus, and the number of hatched larvae was compared between treatments. RESULTS: Median hatch rate across all populations with chilling was 45% (95% CI: 42-48%) and without chilling was 4% (95% CI: 2-6%). Inter-population variation in hatching ranged from 0 to 87% of eggs able to hatch in the absence of a chill stimulus; mean non-chill hatching was 13 ± 2% of eggs (mean ± SE). Non-chill hatching rates were greater than chilled hatching rates in seven of the 90 populations tested. CONCLUSIONS: Clearly, the variation in hatching responses to temperature experience is very large, and therefore the seasonality of the parasite may vary not only between regions but also at farm level. In contrast to what previous work has suggested, there was a geographical trend towards higher non-chill hatching in the northern parts of the UK.


Subject(s)
Nematodirus/growth & development , Strongylida Infections/veterinary , Animals , Female , Larva/growth & development , Male , Ovum/growth & development , Sheep , Sheep Diseases/parasitology , Strongylida Infections/parasitology , Temperature , United Kingdom
9.
Article in English | MEDLINE | ID: mdl-32251964

ABSTRACT

Benzimidazoles (BZ) have been the anthelmintic of choice for controlling Nematodirus battus infections since their release in the 1950s. Despite heavy reliance on this single anthelmintic drug class, resistance was not identified in this nematode until 2010 (Mitchell et al., 2011). This study aimed to explore the prevalence of BZ-resistance mutations in N. battus from UK sheep flocks using deep amplicon sequencing and pyrosequencing platforms. Based on evidence from other gastrointestinal nematodes, resistance in N. battus is likely to be conferred by single nucleotide polymorphisms (SNP) within the β-tubulin isotype 1 locus at codons 167, 198 and 200. Pyrosequencing and deep amplicon sequencing assays were designed to identify the F167Y (TTC to TAC), E198A (GAA to GCA) and F200Y (TTC to TAC) SNPs. Nematodirus battus populations from 253 independent farms were analysed by pyrosequencing; 174 farm populations were included in deep amplicon sequencing and 170 were analysed using both technologies. F200Y was the most prevalent SNP identified throughout the UK, found in 12-27% of the populations tested depending on assay, at a low overall individual frequency of 2.2 ± 0.6% (mean ± SEM, based on pyrosequencing results). Four of the five populations with high frequencies (>20%) of the F200Y mutation were located in NW England. The F167Y SNP was identified, for the first time in this species, in four of the populations tested at a low frequency (1.2 ± 0.01%), indicating the early emergence of the mutation. Neither E198A nor E198L was identified in any of the isolates. Results were comparable between the two techniques for F200Y (Lin's CCC, rc = 0.96), with discrepancies limited to populations with low frequencies. The recent emergence of resistance in this species will provide a unique opportunity to study the early stages of anthelmintic resistance within a natural setting and to track its progress in the future.
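Lin's concordance correlation coefficient (CCC), used above to compare the two genotyping platforms, can be computed directly from its definition: twice the covariance divided by the sum of the variances plus the squared mean difference. A sketch with invented paired allele frequencies (not the study's measurements):

```python
def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between paired
    measurements (population variance/covariance version)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((xi - mx) ** 2 for xi in x) / n
    vy = sum((yi - my) ** 2 for yi in y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Invented F200Y allele frequencies (%) from the two platforms on the
# same populations; values chosen only to illustrate near-agreement.
pyro = [0.0, 1.5, 2.0, 4.1, 22.0, 30.5]
amplicon = [0.3, 1.2, 2.4, 3.8, 21.1, 31.0]
print(f"rc = {lins_ccc(pyro, amplicon):.3f}")
```

Unlike Pearson's correlation, the CCC penalises both location and scale shifts, which is why it is preferred for assay agreement.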


Subject(s)
Benzimidazoles/pharmacology , Drug Resistance/genetics , High-Throughput Nucleotide Sequencing/methods , Nematodirus/genetics , Sheep Diseases/parasitology , Strongylida Infections/veterinary , Animals , Anthelmintics/pharmacology , Farms , Feces/parasitology , Gene Frequency , Genotype , Mutation , Nematodirus/drug effects , Sequence Analysis, DNA , Sheep , Sheep Diseases/drug therapy , Sheep Diseases/epidemiology , Strongylida Infections/drug therapy , Strongylida Infections/epidemiology , United Kingdom/epidemiology
10.
Sci Rep ; 9(1): 7208, 2019 05 10.
Article in English | MEDLINE | ID: mdl-31076637

ABSTRACT

Social network analysis has increasingly been considered a useful tool to interpret the complexity of animal social relationships. However, group composition can affect the contact structure of the network, resulting in variation between networks. Replication in contact network studies is rarely undertaken, but it enables possible variation in response across networks to be determined. Here we explore the importance of between-group variability in social behaviour and the impact of replication on hypothesis testing. We use an exemplar study of social contact data collected from six replicated networks of cattle before and after the application of a social disturbance treatment. In this replicated study, subtle but consistent changes in animal contact patterns were detected after the application of the treatment. We then quantify both within- and between-group variation in this study and explore how varying the number of replicates, and the number of individuals within each network, affects the precision of the differences in treatment effects for the contact behaviour of the resident cattle. The analysis demonstrates that reducing the number of networks observed in the study would reduce the probability of detecting treatment differences for social behaviours, even if the total number of animals were kept the same.
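The replication argument can be illustrated analytically. Under a simple two-level variance-components model (an assumption for illustration, not the authors' exact analysis), the standard error of a treatment contrast shrinks with the number of replicate groups much faster than with the number of animals per group whenever between-group variance is appreciable:

```python
import math

def se_treatment_effect(n_groups, n_per_group, var_between, var_within):
    """Standard error of a before-vs-after treatment contrast averaged
    over replicate groups, under a simple two-level variance-components
    model (between-group and within-group variances)."""
    return math.sqrt(2 * (var_between / n_groups
                          + var_within / (n_groups * n_per_group)))

# Invented variance components with appreciable between-group variation.
vb, vw = 4.0, 9.0
six_groups = se_treatment_effect(6, 8, vb, vw)      # 6 networks of 8
one_big_group = se_treatment_effect(1, 48, vb, vw)  # same 48 animals
print(f"{six_groups:.2f} vs {one_big_group:.2f}")   # 1.31 vs 2.89
```

With the same 48 animals, six replicate groups give a much smaller standard error than one large group, matching the abstract's conclusion about replication.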


Subject(s)
Behavior, Animal/physiology , Animals , Cattle , Models, Biological , Social Behavior , Social Networking
11.
Vet Parasitol ; 267: 42-46, 2019 Mar.
Article in English | MEDLINE | ID: mdl-30878084

ABSTRACT

Optimisation and use of a device for the on-hen in vivo feeding of all haematophagous stages of Dermanyssus gallinae is described. The sealed mesh device contains the mites and is applied to the skin of the hen's thigh, where mites can feed on the bird through a mesh with apertures large enough to allow the mites' mouthparts access to the bird but small enough to contain the mites. By optimising the depth and width of the mesh apertures, we have produced a device which will lead to both reduction and refinement in the use of animals in research, allowing the pre-screening of new vaccines and systemic acaricides/insecticides developed for the control of these blood-feeding parasites before progressing to large field trials. For optimal feeding and post-feeding survival, the device should be constructed from polyester mesh with an aperture width of 105 µm and a depth of 63 µm, and the mites (irrespective of life stage) should be conditioned with no access to food for 3 weeks at 4 °C.


Subject(s)
Animal Welfare , Mite Infestations/prevention & control , Mite Infestations/veterinary , Poultry Diseases/parasitology , Poultry/parasitology , Animal Experimentation , Animal Feed , Animals , Feeding Methods/instrumentation , Mites/physiology
12.
Front Vet Sci ; 5: 192, 2018.
Article in English | MEDLINE | ID: mdl-30159319

ABSTRACT

Active surveillance of rare infectious diseases requires diagnostic tests to have high specificity, otherwise false positive results can outnumber the true cases detected, leading to low positive predictive values. Where a positive result can have economic consequences, such as the culling of a bovine tuberculosis (bTB) positive herd, establishing a high specificity becomes particularly important. When evaluating new diagnostic tests against a "gold standard" reference test with assumed perfect sensitivity and specificity, sample sizes are commonly calculated using a normal approximation to the binomial distribution, although this approach can be misleading. As the expected specificity of the evaluated diagnostic test nears 100%, the errors arising from this approximation are appreciable. Alternatively, it is straightforward to calculate the sample size using more appropriate confidence intervals, while precisely quantifying the effect of sampling variability using the binomial distribution. However, regardless of the approach, if specificity is high the sample size required becomes large, and the gold standard may be prohibitively costly. An alternative to a gold standard test is to use at least two imperfect, conditionally independent tests, and to analyse the results using a variant of the approach initially proposed by Hui and Walter. We show how this method performs for tests with near-perfect specificity; in particular, we show that the sample size required to deliver useful bounds on the precision becomes very large for both approaches. We illustrate these concepts using simulation studies carried out to support the design of a trial of a bTB vaccine and a diagnostic able to "Differentiate Infected and Vaccinated Animals" (DIVA). Both the test characteristics and the efficacy of the bTB vaccine influence the sample size required for the study. We propose an improved methodology using a two-stage approach to evaluating diagnostic tests in low disease prevalence populations. By combining an initial gold standard pilot study with a larger study analysed using a Hui-Walter approach, the sample size required for each study can be reduced and the precision of the specificity estimate improved, since information from both studies is combined.
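The failure of the normal approximation near 100% specificity is easy to demonstrate. A sketch comparing the Wald interval with the Wilson score interval, one closed-form example of the 'more appropriate' intervals mentioned above (the counts are hypothetical, not from the study):

```python
import math

def normal_ci(successes, n, z=1.96):
    """Wald (normal approximation) interval; collapses as p-hat -> 1."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval; remains informative near 0% and 100%."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# 300 known-uninfected animals, all test-negative: observed specificity 100%.
print(normal_ci(300, 300))  # (1.0, 1.0): zero width, clearly misleading
print(wilson_ci(300, 300))  # lower bound ~0.987, a usable estimate
```

The zero-width Wald interval falsely implies no sampling uncertainty, while the Wilson interval correctly reports that 300 negatives only bound specificity below by roughly 98.7%.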

13.
Front Vet Sci ; 4: 65, 2017.
Article in English | MEDLINE | ID: mdl-28534030

ABSTRACT

Liver fluke infection causes serious disease (fasciolosis) in cattle and sheep in many regions of the world, resulting in production losses and additional economic consequences due to condemnation of the liver at slaughter. Liver fluke depends on mud snails as an intermediate host and infects livestock when ingested during grazing. Therefore, environmental factors play important roles in infection risk, and climate change is likely to modify this. Here, we demonstrate how slaughterhouse data can be integrated with other data, including animal movements and climate variables, to identify environmental risk factors for liver fluke in cattle in Scotland. We fitted a generalized linear mixed model to the data, with exposure-weighted random and fixed effects, an approach which takes into account the amount of time cattle spent at different locations, exposed to different levels of risk. This enabled us to identify an increased risk of liver fluke with increased animal age, rainfall and temperature, and for farms located further to the west, in excess of the risk associated with a warmer, wetter climate. This model explained 45% of the variability in liver fluke between farms, suggesting that the unexplained 55% was due to factors not included in the model, such as differences in on-farm management and the presence of wet habitats. This approach demonstrates the value of statistically integrating routinely recorded slaughterhouse data with other pre-existing data, creating a powerful approach to quantify disease risks in production animals. Furthermore, it can be used to better quantify the impact of projected climate change on liver fluke risk in future studies.
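The exposure-weighting idea can be sketched for a single covariate: each animal's effective exposure is the residence-time-weighted average of the covariate over the holdings in its movement history. All values below are invented for illustration (farm names, rainfall, residence days), not taken from the study:

```python
def exposure_weighted(covariate_by_farm, residence_days):
    """Residence-time-weighted mean of a farm-level covariate over an
    animal's movement history."""
    total = sum(residence_days.values())
    return sum(covariate_by_farm[f] * d
               for f, d in residence_days.items()) / total

# Invented movement history: days spent on three holdings with
# differing annual rainfall.
rainfall_mm = {"farmA": 1200.0, "farmB": 900.0, "farmC": 1500.0}
days = {"farmA": 400, "farmB": 100, "farmC": 230}
print(f"{exposure_weighted(rainfall_mm, days):.0f} mm")  # ~1253 mm
```

The fitted model in the paper applies this weighting to both fixed and random effects, so an animal that spent most of its life on a wet western farm carries that exposure even if slaughtered elsewhere.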

14.
PLoS Comput Biol ; 12(7): e1004901, 2016 07.
Article in English | MEDLINE | ID: mdl-27384712

ABSTRACT

Infectious disease surveillance is key to limiting the consequences of infectious pathogens and to maintaining animal and public health. Following the detection of a disease outbreak, a response in proportion to the severity of the outbreak is required. It is thus critical to obtain accurate information concerning the origin of the outbreak and its forward trajectory. However, there is often a lack of situational awareness that may lead to over- or under-reaction. There is a widening range of tests available for detecting pathogens, typically with different temporal characteristics, e.g. in terms of when the peak test response occurs relative to the time of exposure. We have developed a statistical framework that combines response-level data from multiple diagnostic tests and is able to 'hindcast' (infer the historical trend of) an infectious disease epidemic. Assuming diagnostic test data from a cross-sectional sample of individuals infected with a pathogen during an outbreak, we use a Bayesian Markov chain Monte Carlo (MCMC) approach to estimate time of exposure, and the overall epidemic trend in the population prior to the time of sampling. We evaluate the performance of this statistical framework on simulated data from epidemic trend curves and show that we can recover the parameter values of those trends. We also apply the framework to epidemic trend curves taken from two historical outbreaks: a bluetongue outbreak in cattle, and a whooping cough outbreak in humans. Together, these results show that hindcasting can estimate the time since infection for individuals, provide accurate estimates of epidemic trends, and be used to distinguish whether an outbreak is increasing or past its peak. We conclude that if the temporal characteristics of diagnostics are known, it is possible to recover epidemic trends of both human and animal pathogens from cross-sectional data collected at a single point in time.
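The core intuition can be caricatured without MCMC: if each test's mean response curve over time-since-exposure is known, a single cross-sectional pair of readings constrains when exposure occurred, because the two tests peak at different times. A deliberately simplified, noise-free grid-search sketch (the curves and values are invented; the paper's Bayesian framework is far richer):

```python
import math

def loglik(t, obs, curves, sd=0.1):
    """Gaussian log-likelihood (up to a constant) of the observed test
    responses at candidate time-since-exposure t."""
    return sum(-((y - curve(t)) ** 2) / (2 * sd ** 2)
               for curve, y in zip(curves, obs))

# Invented mean response curves for two tests peaking at different times.
def early(t):
    return math.exp(-((t - 7) ** 2) / 50)    # peaks around day 7

def late(t):
    return math.exp(-((t - 21) ** 2) / 200)  # peaks around day 21

curves = [early, late]
true_t = 14
obs = [c(true_t) for c in curves]  # noise-free readings for illustration

best = max(range(0, 61), key=lambda t: loglik(t, obs, curves))
print(best)  # the grid search recovers day 14
```

A single test would leave an ambiguity (the early curve alone matches at both day 0 and day 14); combining the two tests resolves it, which is the point of the multi-test framework.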


Subject(s)
Computational Biology/methods , Epidemics/statistics & numerical data , Models, Statistical , Population Surveillance/methods , Algorithms , Animals , Bluetongue , Cattle , Cattle Diseases , Cross-Sectional Studies , Epidemics/prevention & control , Humans , Whooping Cough
15.
Water Res ; 87: 175-81, 2015 Dec 15.
Article in English | MEDLINE | ID: mdl-26408950

ABSTRACT

Waterborne transmission of Toxoplasma gondii is a potential public health risk, and there are currently no agreed optimised methods for the recovery, processing and detection of T. gondii oocysts in water samples. In this study, modified methods of T. gondii oocyst recovery and DNA extraction were applied to 1427 samples collected from 147 public water supplies throughout Scotland. T. gondii DNA was detected, using real-time PCR (qPCR) targeting the 529 bp repeat element, in 8.79% of interpretable samples (124 out of 1411 samples). The samples positive for T. gondii DNA originated from a third of the sampled water sources. The samples positive by qPCR, and some of the negative samples, were reanalysed using ITS1 nested PCR (nPCR) and the results compared. The 529 bp qPCR was the more sensitive technique, and a full analysis of assay performance, by Bayesian analysis using a Markov chain Monte Carlo method, demonstrated the efficacy of this method for the detection of T. gondii in water samples.


Subject(s)
Drinking Water/parasitology , Real-Time Polymerase Chain Reaction/methods , Toxoplasma/isolation & purification , Bayes Theorem , DNA, Protozoan/genetics , Oocysts , Scotland , Toxoplasma/genetics , Water Supply
16.
BMC Vet Res ; 10: 95, 2014 Apr 26.
Article in English | MEDLINE | ID: mdl-24766709

ABSTRACT

BACKGROUND: Escherichia coli (E. coli) O157 is a virulent zoonotic strain of enterohaemorrhagic E. coli. In Scotland (1998-2008) the annual reported rate of human infection was 4.4 per 100,000 population, consistently higher than in other regions of the UK and abroad. Cattle are the primary reservoir, so understanding infection dynamics in cattle is paramount to reducing human infections. A large database was created for farms sampled in two cross-sectional surveys carried out in Scotland (1998-2004). A statistical model was generated to identify risk factors for the presence of E. coli O157 on farms. Specific hypotheses were tested regarding the presence of E. coli O157 on local farms and a farm's previous status. Pulsed-field gel electrophoresis (PFGE) profiles were further examined to ascertain whether local spread or persistence of strains could be inferred. RESULTS: The presence of an E. coli O157-positive local farm (average distance: 5.96 km) in the Highlands, North East and South West, farm size, and the number of cattle moved onto the farm in the 8 weeks prior to sampling were significant risk factors for the presence of E. coli O157 on farms. The previous status of a farm was not a significant predictor of current status (p = 0.398). Farms within the same sampling cluster were significantly more likely to carry the same PFGE type (p < 0.001), implicating spread of strains between local farms. Isolates with identical PFGE types were observed to persist across the two surveys, including 3 that were identified on the same farm, suggesting an environmental reservoir. Persistent PFGE types were more likely to have been observed in human clinical infections in Scotland (p < 0.001) over the same time frame. CONCLUSIONS: The results of this study demonstrate the spread of E. coli O157 between local farms and highlight the potential link between persistent cattle strains and human clinical infections in Scotland.
This novel insight into the epidemiology of Scottish E. coli O157 paves the way for future research into the mechanisms of transmission which should help with the design of control measures to reduce E. coli O157 from livestock-related sources.
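The key spatial risk factor above, having an E. coli O157-positive farm nearby, can be illustrated with a minimal sketch: a great-circle (haversine) neighbour check against the study's 5.96 km average distance. The farm coordinates and statuses below are invented for illustration, not study data.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def positive_neighbour_flags(farms, radius_km=5.96):
    """For each farm, flag whether any *other* farm within radius_km is positive.

    farms: list of (farm_id, lat, lon, is_positive) tuples.
    Returns {farm_id: bool}.
    """
    flags = {}
    for fid, lat, lon, _ in farms:
        flags[fid] = any(
            other_pos and haversine_km(lat, lon, olat, olon) <= radius_km
            for ofid, olat, olon, other_pos in farms
            if ofid != fid
        )
    return flags

# Hypothetical farms: two close together near Inverness, one distant.
farms = [
    ("A", 57.48, -4.22, True),   # positive
    ("B", 57.50, -4.25, False),  # ~3 km from A, so it has a positive neighbour
    ("C", 56.00, -3.00, False),  # far away, so no positive neighbour
]
```

In the actual analysis this indicator would enter a regression model alongside farm size and cattle-movement covariates; the sketch only shows how the indicator itself could be derived.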


Subject(s)
Cattle Diseases/microbiology , Escherichia coli Infections/veterinary , Escherichia coli O157/isolation & purification , Animals , Cattle , Cattle Diseases/epidemiology , Escherichia coli Infections/epidemiology , Escherichia coli Infections/microbiology , Risk Factors , Scotland/epidemiology
17.
Vet Res ; 44: 103, 2013 Oct 31.
Article in English | MEDLINE | ID: mdl-24176040

ABSTRACT

Two ruminant acute phase proteins (APPs), haptoglobin (Hp) and serum amyloid A (SAA), were evaluated as serum biomarkers (BMs) for sheep scab, a highly contagious ectoparasitic disease caused by the mite Psoroptes ovis and a major welfare and production threat worldwide. Levels of both APPs increased in serum following experimental infestation of sheep with P. ovis, becoming statistically significantly elevated above pre-infestation levels at 4 weeks post-infestation. Following successful treatment of infested sheep with an endectocide, Hp and SAA serum levels declined rapidly, with half-lives of less than 3 days. In contrast, serum IgG levels which specifically bound the P. ovis-derived diagnostic antigen Pso o 2 had a half-life of 56 days. Taking into account pre-infestation serum levels, rapidity of response to infestation and test sensitivity at the estimated optimum cut-off values, SAA was the more discriminatory marker. These studies illustrated the potential of SAA and Hp to indicate current sheep scab infestation status and to augment the existing Pso o 2 serological assay to give disease-specific indications of both infestation and successful treatment.
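The half-life figures above correspond to first-order exponential decay after treatment, where t1/2 = ln 2 / k and the decay rate k can be estimated from two serum measurements. The concentrations below are invented; only the 3-day result echoes the magnitude reported for Hp and SAA.

```python
from math import log

def decay_half_life(c0, c1, days):
    """Half-life in days, assuming exponential decay c(t) = c0 * exp(-k*t).

    c0, c1: serum concentrations at the start and after `days` days (c1 < c0).
    """
    k = log(c0 / c1) / days   # first-order decay rate, per day
    return log(2) / k         # t_half = ln(2) / k

# Hypothetical haptoglobin fall from 2.0 to 0.25 g/L over 9 days:
# a factor of 8 = 2**3, i.e. three halvings in 9 days.
t_half = decay_half_life(2.0, 0.25, 9)
```

The same arithmetic with a single halving over 56 days reproduces the much slower IgG kinetics, which is why the APPs, unlike anti-Pso o 2 IgG, can flag successful treatment quickly.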


Subject(s)
Haptoglobins/metabolism , Mite Infestations/veterinary , Psoroptidae/physiology , Serum Amyloid A Protein/metabolism , Sheep Diseases/parasitology , Skin Diseases/veterinary , Acaricides/pharmacology , Animals , Antigens/blood , Biomarkers/blood , Blotting, Western/veterinary , Colorimetry/veterinary , Enzyme-Linked Immunosorbent Assay/veterinary , Female , Ivermectin/analogs & derivatives , Ivermectin/pharmacology , Male , Mite Infestations/drug therapy , Mite Infestations/parasitology , Sheep , Sheep Diseases/drug therapy , Skin Diseases/drug therapy , Skin Diseases/parasitology
18.
Epidemics ; 5(2): 67-76, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23746799

ABSTRACT

The importance of considering coupled interactions across multiple population scales has not previously been studied for highly pathogenic avian influenza (HPAI) in the British commercial poultry industry. By simulating the within-flock transmission of HPAI using a deterministic S-E-I-R model, and by incorporating an additional environmental class representing infectious faeces, we tracked the build-up of infectious faeces within a poultry house over time. A measure of the transmission risk (TR) was computed for each farm by linking the amount of infectious faeces present on each day of an outbreak with data describing the daily on-farm visit schedules of a major British catching company. Larger flocks tended to receive more of these catching-team visits. However, where density-dependent contact was assumed, faster outbreak detection (according to an assumed mortality threshold) reduced the opportunity for catching-team visits to coincide with an outbreak. For this reason, maximum TR levels were found for mid-range flock sizes (~25,000-35,000 birds). When all factors were assessed simultaneously using multivariable linear regression on the simulated outputs, those related to the pattern of catching-team visits had the largest effect on TR, with the most important movement-related factor depending on the mode of transmission. Using social network analysis on a further database to inform a measure of between-farm connectivity, we identified a large fraction of farms (28%) with both a high TR and a high potential impact at the between-farm level. Our results have counter-intuitive implications for between-farm spread that could not be predicted from flock size alone and, together with further knowledge of the relative importance of transmission risk and impact, could inform improved targeting of control measures.
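The within-flock model can be sketched as a deterministic SEIR system plus an environmental compartment F for infectious faeces, shed by infectious birds and decaying over time. All parameter values below are invented placeholders, and the sketch uses frequency-dependent transmission (the study also considered density-dependent contact); it illustrates the model structure only, not the study's results.

```python
def seir_f(n=25_000, beta=0.8, sigma=0.5, gamma=0.25, phi=1.0, delta=0.2,
           seed=10, dt=0.01, days=60):
    """Euler-integrate a deterministic SEIR model with an extra
    environmental class F of infectious faeces (arbitrary units).

    beta: transmission rate; sigma: 1/latent period; gamma: 1/infectious
    period; phi: faeces shed per infectious bird per day; delta: decay of
    infectiousness in faeces. Returns the final (S, E, I, R, F) state.
    """
    s, e, i, r, f = n - seed, 0.0, float(seed), 0.0, 0.0
    for _ in range(round(days / dt)):
        new_inf = beta * s * i / n        # frequency-dependent transmission
        ds = -new_inf
        de = new_inf - sigma * e
        di = sigma * e - gamma * i
        dr = gamma * i
        df = phi * i - delta * f          # shedding minus decay
        s += ds * dt; e += de * dt; i += di * dt; r += dr * dt; f += df * dt
    return s, e, i, r, f
```

Tracking F through time in this way, rather than just I, is what lets a daily visit schedule be weighted by the amount of infectious material present in the house on each day of the outbreak.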


Subject(s)
Agriculture , Disease Outbreaks/veterinary , Influenza in Birds/transmission , Animals , Feces/virology , Humans , Influenza in Birds/epidemiology , Models, Biological , Poultry , Risk , United Kingdom/epidemiology
19.
BMC Infect Dis ; 12: 80, 2012 Apr 01.
Article in English | MEDLINE | ID: mdl-22462563

ABSTRACT

BACKGROUND: Genetic typing data are a potentially powerful resource for determining how infection is acquired. In this paper, MLST typing was used to distinguish the routes and risks of infection of humans with Campylobacter jejuni from poultry and ruminant sources. METHODS: C. jejuni samples from animal and environmental sources and from reported human cases confirmed between June 2005 and September 2006 were typed using MLST. The STRUCTURE software was used to assign the specific sequence types of the sporadic human cases to a particular source. We then used mixed case-case logistic regression analysis to compare the risk factors for being infected with C. jejuni from different sources. RESULTS: A total of 1,599 (46.3%) cases were assigned to poultry, 1,070 (31.0%) to ruminant and 67 (1.9%) to wild bird sources; the remaining 715 (20.7%) could not be assigned to a source with a probability greater than 0.95. Compared with ruminant sources, cases attributed to poultry sources were more likely to be adults (odds ratio (OR) = 1.497, 95% confidence intervals (CIs) = 1.211, 1.852), less likely to be male (OR = 0.834, 95% CIs = 0.712, 0.977), more likely to live in areas with a population density greater than 500 people/km2 (OR = 1.213, 95% CIs = 1.030, 1.431), more likely to be reported in the winter (OR = 1.272, 95% CIs = 1.067, 1.517) and more likely to have undertaken recent overseas travel (OR = 1.618, 95% CIs = 1.056, 2.481). The poultry-assigned strains had a similar epidemiology to the unassigned strains, with the exception of a significantly higher likelihood of reported overseas travel among unassigned strains. CONCLUSIONS: Rather than estimating relative risks for acquiring infection, our analyses show that individuals who acquire C. jejuni infection from different sources have different associated risk factors. By enhancing our ability to identify at-risk groups and the times at which these groups are likely to be at risk, this work allows public health messages to be targeted more effectively.
The rapidly increasing capacity to conduct genetic typing of pathogens makes such source-attributed epidemiological analysis more accessible and has the potential to substantially enhance epidemiological risk factor studies.
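Each odds ratio above compares the odds of an exposure among poultry-assigned cases against ruminant-assigned cases. A minimal sketch of how an unadjusted OR and its Wald 95% CI are computed from a 2x2 table follows; the counts are invented, and the published ORs come from a multivariable model, so this is illustrative only.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table.

    a: exposed cases in the index group (e.g. adults, poultry-assigned)
    b: unexposed cases in the index group
    c: exposed cases in the comparison group (adults, ruminant-assigned)
    d: unexposed cases in the comparison group
    """
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 1,200 adults / 399 children among poultry-assigned
# cases vs 700 adults / 370 children among ruminant-assigned cases.
or_, lo, hi = odds_ratio_ci(1200, 399, 700, 370)
```

A CI entirely above 1, as for the adult and travel effects in the abstract, indicates the exposure is significantly more common in poultry-assigned than ruminant-assigned cases.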


Subject(s)
Campylobacter Infections/epidemiology , Campylobacter Infections/transmission , Campylobacter jejuni/classification , Campylobacter jejuni/genetics , Multilocus Sequence Typing , Zoonoses/epidemiology , Zoonoses/transmission , Adult , Animals , Campylobacter Infections/microbiology , Campylobacter jejuni/isolation & purification , Child , Child, Preschool , Cluster Analysis , Female , Genotype , Humans , Male , Molecular Epidemiology , Poultry , Ruminants , Scotland/epidemiology
20.
Parasit Vectors ; 5: 7, 2012 Jan 10.
Article in English | MEDLINE | ID: mdl-22233730

ABSTRACT

BACKGROUND: Sheep scab is a highly contagious disease of sheep caused by the ectoparasitic mite Psoroptes ovis. The disease is endemic in the UK and has a significant economic impact through its effects on performance and welfare. Diagnosis of sheep scab is achieved through observation of clinical signs, e.g. pruritus (itching) and wool loss, and ultimately through the detection of mites in skin scrapings. Early stages of infestation are often difficult to diagnose, and sub-clinical animals can be a major factor in disease spread. The development of a diagnostic assay would enable farmers and veterinarians to detect disease at an early stage, reducing the risk of developing clinical disease and limiting spread. METHODS: Serum samples were obtained from an outbreak of sheep scab within an experimental flock (n = 480; 3 samples each from 160 sheep), allowing assessment by ELISA of sheep scab-specific antibody prior to infestation, mid-outbreak (combined with clinical assessment) and post-treatment. RESULTS: Analysis of pre-infestation samples demonstrated a low level of potential false positives (3.8%). Of the 27 animals with clinical or behavioural signs of disease, 25 tested positive at the mid-outbreak sampling period; the remaining 2 tested positive at the subsequent sampling period. Clinical assessment revealed the absence of clinical or behavioural signs of disease in 132 sheep, whilst analysis of mid-outbreak samples showed that 105 of these clinically negative animals were serologically positive, representing potential sub-clinical infestations. CONCLUSIONS: This study demonstrates that this ELISA test can effectively diagnose sheep scab in a natural outbreak of disease and, more importantly, highlights its ability to detect sub-clinically infested animals. This ELISA, employing a single recombinant antigen, represents a major step forward in the diagnosis of sheep scab and may prove critical in any future control program.
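The headline figures above reduce to simple proportions. Assuming the 3.8% pre-infestation false-positive rate corresponds to 6 of the 160 sheep (6/160 = 3.75%, rounded; the count of 6 is inferred, not stated), the implied mid-outbreak performance against clinical status can be recomputed as below. Note the "sensitivity" here is relative to clinical signs at a single time point, not true infestation status.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity, as percentages, from test counts."""
    sensitivity = 100 * tp / (tp + fn)
    specificity = 100 * tn / (tn + fp)
    return sensitivity, specificity

# From the abstract: 25 of 27 clinically affected sheep tested ELISA-positive
# mid-outbreak (2 seroconverted later); 6 of 160 pre-infestation sera were
# assumed false positives.
sens, spec = sens_spec(tp=25, fn=2, tn=154, fp=6)
```

On these assumed counts the test detects roughly 93% of clinically affected animals while flagging under 4% of uninfested sheep, which is why the 105 serologically positive but clinically negative animals are plausibly sub-clinical infestations rather than assay noise.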


Subject(s)
Antibodies/blood , Disease Outbreaks/veterinary , Enzyme-Linked Immunosorbent Assay/veterinary , Mite Infestations/veterinary , Psoroptidae/immunology , Sheep Diseases/diagnosis , Animals , Antibodies/metabolism , Antigens , Enzyme-Linked Immunosorbent Assay/methods , False Negative Reactions , False Positive Reactions , Female , Mite Infestations/diagnosis , Mite Infestations/epidemiology , Recombinant Proteins , Sensitivity and Specificity , Serologic Tests , Sheep , Sheep Diseases/epidemiology , Sheep Diseases/parasitology , Time Factors